Variational Bayesian Stochastic Complexity of Mixture Models

Authors

  • Kazuho Watanabe
  • Sumio Watanabe
Abstract

The Variational Bayesian framework has been widely used to approximate Bayesian learning, and in many applications it has provided computational tractability and good generalization performance. In this paper, we discuss Variational Bayesian learning of mixtures of exponential families and provide additional theoretical support by deriving the asymptotic form of the stochastic complexity. The stochastic complexity, which corresponds to the minimum variational free energy and whose negative is a lower bound on the log marginal likelihood, is a key quantity for model selection. It also enables us to discuss the effect of hyperparameters and the accuracy of the Variational Bayesian approach as an approximation of true Bayesian learning.
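
To make these quantities concrete, the following sketch relates the variational free energy to the Bayesian stochastic complexity; the notation is an assumption of this summary, not necessarily that of the paper. For a sample X^n = (x_1, ..., x_n), a prior \varphi(w), and latent variables z,

\[
Z(X^n) = \int \varphi(w) \prod_{i=1}^{n} p(x_i \mid w)\, dw,
\qquad
F(X^n) = -\log Z(X^n),
\]
\[
\overline{F}(X^n)
= \min_{q(w,z) = q(w)\,q(z)}
\mathbb{E}_{q}\!\left[ \log \frac{q(w,z)}{p(X^n, z \mid w)\,\varphi(w)} \right]
\;\geq\; F(X^n),
\]

so that -\overline{F}(X^n) is a lower bound on the log marginal likelihood \log Z(X^n). The variational stochastic complexity analyzed in the paper is the asymptotic form of \overline{F}(X^n).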


Related articles

Truncation-free Stochastic Variational Inference for Bayesian Nonparametric Models

We present a truncation-free stochastic variational inference algorithm for Bayesian nonparametric models. While traditional variational inference algorithms require truncations for the model or the variational distribution, our method adapts model complexity on the fly. We studied our method with Dirichlet process mixture models and hierarchical Dirichlet process topic models on two large data...
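
For context on what "truncation" means here, a minimal sketch in standard stick-breaking notation (an assumption of this summary, not taken from the abstract): a Dirichlet process mixture draws its mixing weights as

\[
v_k \sim \mathrm{Beta}(1, \alpha),
\qquad
\pi_k = v_k \prod_{j=1}^{k-1} (1 - v_j),
\qquad k = 1, 2, \ldots,
\]

and truncated variational inference fixes a level K by setting v_K = 1, so that \pi_k = 0 for all k > K. The method described above avoids committing to such a K in advance.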

Stochastic Complexities of Gaussian Mixtures in Variational Bayesian Approximation

Bayesian learning has been widely used and proved effective in many data modeling problems. However, the computations it involves are costly and generally cannot be performed exactly. The variational Bayesian approach, proposed as an approximation of Bayesian learning, has provided computational tractability and good generalization performance in many applications. The properties a...
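
As an illustration only (not the authors' code), the variational lower bound reported by scikit-learn's BayesianGaussianMixture can serve as a finite-sample proxy for the free energy whose asymptotic form such analyses characterize. The hyperparameter values and the synthetic data below are assumptions of this sketch.

import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Synthetic 1-D data from a two-component Gaussian mixture.
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(loc=-2.0, scale=1.0, size=(200, 1)),
    rng.normal(loc=+2.0, scale=1.0, size=(200, 1)),
])

# Fit variational Bayesian Gaussian mixtures of increasing size.
# lower_bound_ is scikit-learn's variational lower bound on the evidence;
# its negative plays the role of a variational free energy.
for k in (1, 2, 4, 8):
    vb = BayesianGaussianMixture(
        n_components=k,
        weight_concentration_prior=1.0,  # Dirichlet hyperparameter (assumed value)
        max_iter=500,
        random_state=0,
    ).fit(X)
    print(k, vb.lower_bound_)

Comparing the bound across component counts gives a crude model-selection criterion of the kind the stochastic-complexity analysis puts on firmer asymptotic footing.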

On Variational Bayes Algorithms for Exponential Family Mixtures

In this paper, we empirically analyze the behavior of the Variational Bayes algorithm for mixture models. While Variational Bayesian learning has provided computational tractability and good generalization performance in many applications, little has been done to investigate its properties. Recently, the stochastic complexity of mixture models in Variational Bayesian learning was cl...
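
In the usual mean-field formulation (notation is an assumption of this summary), the Variational Bayes algorithm for such mixtures is the coordinate ascent that alternates

\[
\log q(z) = \mathbb{E}_{q(w)}\!\left[ \log p(X^n, z \mid w) \right] + \mathrm{const},
\qquad
\log q(w) = \log \varphi(w) + \mathbb{E}_{q(z)}\!\left[ \log p(X^n, z \mid w) \right] + \mathrm{const},
\]

each step of which does not increase the variational free energy; for exponential-family mixtures both updates have closed forms in terms of expected sufficient statistics.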

Stochastic Variational Inference for Bayesian Sparse Gaussian Process Regression

This paper presents a novel variational inference framework for deriving a family of Bayesian sparse Gaussian process regression (SGPR) models whose approximations are variationally optimal with respect to the full-rank GPR model enriched with various corresponding correlation structures of the observation noises. Our variational Bayesian SGPR (VBSGPR) models jointly treat both the distribution...
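
For orientation, a commonly used inducing-variable bound for sparse GP regression with i.i.d. noise is shown below; the VBSGPR models above generalize this setting to correlated noise structures, and the notation is an assumption of this summary rather than the paper's.

\[
\log p(\mathbf{y}) \;\geq\;
\log \mathcal{N}\!\left(\mathbf{y} \mid \mathbf{0},\; \mathbf{Q}_{nn} + \sigma^{2}\mathbf{I}\right)
- \frac{1}{2\sigma^{2}} \operatorname{tr}\!\left(\mathbf{K}_{nn} - \mathbf{Q}_{nn}\right),
\qquad
\mathbf{Q}_{nn} = \mathbf{K}_{nm}\mathbf{K}_{mm}^{-1}\mathbf{K}_{mn},
\]

where \mathbf{K}_{nm} is the cross-covariance between the n training inputs and the m inducing inputs.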

An Alternative View of Variational Bayes and Minimum Variational Stochastic Complexity

Bayesian learning is widely used in many applied data-modelling problems and is often accompanied by approximation schemes, since it requires intractable computation of posterior distributions. In this study, we focus on two approximation methods: variational Bayes and the local variational approximation. We show that the variational Bayes approach for statistical models with latent...
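
As a concrete example of a local variational approximation (a standard one, not necessarily the construction used in this study), the logistic sigmoid admits the bound

\[
\sigma(x) \;\geq\; \sigma(\xi)\,
\exp\!\left\{ \frac{x-\xi}{2} - \lambda(\xi)\,(x^{2} - \xi^{2}) \right\},
\qquad
\lambda(\xi) = \frac{1}{4\xi}\tanh\!\left(\frac{\xi}{2}\right),
\]

which is tight at x = \pm\xi and, once substituted for the likelihood terms, keeps the integral over the parameters tractable.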

Journal:

Volume   Issue

Pages  -

Publication date: 2005